Combining Nearest Neighbor Classifiers Versus Cross-Validation Selection


Related articles

Combining nearest neighbor classifiers versus cross-validation selection.

Various discriminant methods have been applied for classification of tumors based on gene expression profiles, among which the nearest neighbor (NN) method has been reported to perform relatively well. Usually cross-validation (CV) is used to select the neighbor size as well as the number of variables for the NN method. However, CV can perform poorly when there is considerable uncertainty in ch...
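The selection procedure the abstract describes, choosing the neighbor size k by cross-validation, can be sketched in plain Python. This is a minimal illustration only; the function names and toy data below are my own, not from the paper:

```python
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training points."""
    # train is a list of (features, label) pairs; squared Euclidean distance
    neighbors = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    votes = Counter(label for _, label in neighbors[:k])
    return votes.most_common(1)[0][0]

def loocv_error(data, k):
    """Leave-one-out cross-validation error rate for neighbor size k."""
    errors = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]  # hold out point i
        if knn_predict(rest, x, k) != y:
            errors += 1
    return errors / len(data)

# Two well-separated toy classes
data = [((0.0, 0.0), 'a'), ((0.1, 0.2), 'a'), ((0.2, 0.1), 'a'),
        ((1.0, 1.0), 'b'), ((0.9, 1.1), 'b'), ((1.1, 0.9), 'b')]

# Pick the neighbor size with the lowest CV error (ties go to the smaller k)
best_k = min((1, 3, 5), key=lambda k: loocv_error(data, k))
```

The abstract's point is that this selection can be unreliable when several values of k (or variable subsets) have nearly identical CV error, which is the uncertainty that motivates combining instead of selecting.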


Complete Cross-Validation for Nearest Neighbor Classifiers

Cross-validation is an established technique for estimating the accuracy of a classifier and is normally performed either using a number of random test/train partitions of the data, or using k-fold cross-validation. We present a technique for calculating the complete cross-validation for nearest-neighbor classifiers: i.e., averaging over all desired test/train partitions of data. This technique ...
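To make the quantity concrete — the error rate averaged over every possible test/train partition of a fixed test size — here is a brute-force sketch in plain Python. The paper's contribution is computing this efficiently; this naive enumeration, with illustrative names and toy data of my own, only defines what is being computed:

```python
from itertools import combinations

def nn_label(train, query):
    """1-NN rule: return the label of the closest training point."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))[1]

def complete_cv_error(data, test_size):
    """Average 1-NN error over ALL test/train splits with the given test size."""
    total, splits = 0.0, 0
    for test_idx in combinations(range(len(data)), test_size):
        test = [data[i] for i in test_idx]
        train = [data[i] for i in range(len(data)) if i not in test_idx]
        total += sum(nn_label(train, x) != y for x, y in test) / test_size
        splits += 1
    return total / splits

data = [((0.0, 0.0), 'a'), ((0.1, 0.2), 'a'), ((0.2, 0.1), 'a'),
        ((1.0, 1.0), 'b'), ((0.9, 1.1), 'b'), ((1.1, 0.9), 'b')]
```

With `test_size=1` this reduces to leave-one-out; for larger test sizes the number of partitions grows combinatorially, which is why an exact closed-form computation is valuable.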


Validation of nearest neighbor classifiers

We develop a probabilistic bound on the error rate of the nearest neighbor classifier formed from a set of labelled examples. The bound is computed using only the examples in the set. A subset of the examples is used as a validation set to bound the error rate of the classifier formed from the remaining examples. Then a bound is computed for the difference in error rates between the original class...


Nearest Neighbor Classifiers

The 1-NN classifier is one of the oldest methods known. The idea is extremely simple: to classify X, find its closest neighbor among the training points (call it X′) and assign to X the label of X′.
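The rule as stated translates almost directly into code. A minimal sketch, with toy points of my own for illustration:

```python
def one_nn(train, query):
    """1-NN rule: find the closest training point X' and return its label."""
    # train is a list of (features, label) pairs; squared Euclidean distance
    nearest = min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    return nearest[1]

points = [((0.0, 0.0), 'red'), ((1.0, 1.0), 'blue')]
label = one_nn(points, (0.2, 0.1))  # (0.2, 0.1) lies closest to (0.0, 0.0)
```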


Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets

Combining multiple classifiers is an effective technique for improving accuracy. There are many general combining algorithms, such as Bagging or Error Correcting Output Coding, that significantly improve classifiers like decision trees, rule learners, or neural networks. Unfortunately, many combining methods do not improve the nearest neighbor classifier. In this paper, we present MFS, a combining a...
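The core idea behind MFS — voting over nearest-neighbor classifiers that each see a different random feature subset — can be sketched as follows. The parameter names, defaults, and toy data are illustrative assumptions of mine, not the paper's specification:

```python
import random
from collections import Counter

def nn_label(train, query, feats):
    """1-NN restricted to the given feature indices."""
    dist = lambda p: sum((p[0][i] - query[i]) ** 2 for i in feats)
    return min(train, key=dist)[1]

def mfs_predict(train, query, n_members=11, subset_size=2, seed=0):
    """Majority vote over NN classifiers, each using a random feature subset."""
    rng = random.Random(seed)
    n_feats = len(train[0][0])
    votes = Counter()
    for _ in range(n_members):
        feats = rng.sample(range(n_feats), subset_size)  # one member's view
        votes[nn_label(train, query, feats)] += 1
    return votes.most_common(1)[0][0]

data = [((0, 0, 0), 'a'), ((0, 0, 1), 'a'),
        ((1, 1, 0), 'b'), ((1, 1, 1), 'b')]
```

Randomizing the feature view, rather than resampling the training points as Bagging does, is what makes this kind of ensemble effective for nearest neighbor, since NN is largely insensitive to Bagging-style resampling.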



Journal

Journal title: Statistical Applications in Genetics and Molecular Biology

Year: 2004

ISSN: 1544-6115

DOI: 10.2202/1544-6115.1054